Sparse model identification using orthogonal forward regression with basis pursuit and D-optimality - Control Theory and Applications, IEE Proceedings-

Author

  • X. Hong
Abstract

An efficient model identification algorithm for a large class of linear-in-the-parameters models is introduced that simultaneously optimises the model approximation ability, sparsity and robustness. The derived model parameters in each forward regression step are initially estimated via orthogonal least squares (OLS) and then tuned with a new gradient-descent learning algorithm based on basis pursuit that minimises the l1 norm of the parameter estimate vector. The model subset selection cost function includes a D-optimality design criterion that maximises the determinant of the design matrix of the subset, so as to ensure model robustness and to enable the model selection procedure to terminate automatically at a sparse model. The proposed approach is based on the forward OLS algorithm using the modified Gram–Schmidt procedure. Both the parameter tuning procedure, based on basis pursuit, and the model selection criterion, based on D-optimality, which is effective in ensuring model robustness, are integrated with the forward regression. As a consequence, the inherent computational efficiency associated with the conventional forward OLS approach is maintained in the proposed algorithm. Examples demonstrate the effectiveness of the new approach.
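The abstract describes a forward selection loop in which each candidate regressor is orthogonalised by the modified Gram–Schmidt procedure and scored by a cost that combines approximation ability with a D-optimality term. The sketch below illustrates that structure in Python under stated assumptions: the combined score is taken as the error-reduction ratio plus beta*log(w'w), and selection stops when the score is no longer positive. The exact weighting and stopping rule of the paper are not reproduced here, and the names ofr_dopt, beta and max_terms are illustrative.

```python
import numpy as np

def ofr_dopt(P, y, beta=1e-4, max_terms=None):
    """Forward orthogonal regression (modified Gram-Schmidt) with a
    D-optimality term in the selection score; an illustrative sketch,
    not the paper's exact criterion."""
    _, M = P.shape
    max_terms = max_terms or M
    W = P.astype(float).copy()        # candidate regressors, orthogonalised in place
    yy = float(y @ y)
    selected, g = [], []

    for _ in range(max_terms):
        best, best_score = None, -np.inf
        for i in range(M):
            if i in selected:
                continue
            w = W[:, i]
            wtw = float(w @ w)
            if wtw < 1e-12:           # candidate already lies in the selected subspace
                continue
            err = (float(w @ y) ** 2) / (wtw * yy)   # error-reduction ratio
            score = err + beta * np.log(wtw)         # D-optimality favours large w'w
            if score > best_score:
                best, best_score = i, score
        if best is None or best_score <= 0.0:        # automatic termination
            break
        w = W[:, best]
        wtw = float(w @ w)
        g.append(float(w @ y) / wtw)                 # orthogonal-model coefficient
        selected.append(best)
        # modified Gram-Schmidt deflation of the remaining candidates
        for i in range(M):
            if i not in selected:
                W[:, i] = W[:, i] - (float(w @ W[:, i]) / wtw) * w
    return selected, np.array(g)
```

On return, selected indexes the chosen regressors and g holds their coefficients in the orthogonalised model; recovering the parameters of the original model would require the usual back-substitution through the triangular matrix accumulated by Gram–Schmidt, which is omitted here for brevity.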


Similar Articles

Sparse multioutput radial basis function network construction using combined locally regularised orthogonal least square and D-optimality experimental design - Control Theory and Applications, IEE Proceedings-

A construction algorithm for multioutput radial basis function (RBF) network modelling is introduced by combining a locally regularised orthogonal least squares (LROLS) model selection with a D-optimality experimental design. The proposed algorithm aims to achieve maximised model robustness and sparsity via two effective and complementary approaches. The LROLS method alone is capable of producing...
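This excerpt refers to local regularisation, in which every regressor carries its own regularisation parameter. As a brief illustration, and assuming the usual ridge form for a single orthogonalised regressor (the excerpt itself does not show the exact criterion), the regularised coefficient and error-reduction ratio could be computed as follows; the function and variable names are illustrative.

```python
import numpy as np

def regularised_err(w, y, lam):
    """Per-regressor (locally) regularised quantities for one orthogonalised
    regressor w with its own regularisation parameter lam: a ridge-style
    coefficient and the corresponding regularised error-reduction ratio.
    A sketch of an LROLS-style selection score, not the paper's code."""
    wtw = float(w @ w)
    g = float(w @ y) / (wtw + lam)              # regularised orthogonal coefficient
    rerr = g * g * (wtw + lam) / float(y @ y)   # regularised error-reduction ratio
    return g, rerr
```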

An Orthogonal Forward Regression Algorithm Combined with Basis Pursuit and D-optimality

A new forward regression model identification algorithm is introduced. The derived model parameters, in each forward regression step, are initially estimated via orthogonal least squares (OLS) (using the modified Gram-Schmidt procedure) and then tuned with a new gradient-descent learning algorithm based on basis pursuit that minimizes the l1 norm of the parameter estimate vector. The...
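The excerpt states that the OLS estimates are refined by a gradient-descent rule driven by the basis-pursuit (l1-norm) objective. Below is a minimal sketch of such a refinement step, assuming a subgradient update on a squared-error-plus-l1 cost starting from the OLS estimate; the paper's actual update rule is not given in this excerpt, and rho, lr and iters are illustrative settings.

```python
import numpy as np

def l1_tune(P, y, theta0, rho=0.01, lr=1e-3, iters=500):
    """Subgradient-descent refinement of an initial (e.g. OLS) estimate theta0,
    trading squared fitting error against the l1 norm of the parameters.
    rho, lr and iters are illustrative choices, not values from the paper."""
    theta = np.asarray(theta0, dtype=float).copy()
    for _ in range(iters):
        # subgradient of 0.5*||y - P @ theta||^2 + rho*||theta||_1
        grad = P.T @ (P @ theta - y) + rho * np.sign(theta)
        theta = theta - lr * grad
    return theta
```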

Fully complex-valued radial basis function networks: Orthogonal least squares regression and classification

We consider a fully complex-valued radial basis function (RBF) network for regression and classification applications. For regression problems, the locally regularised orthogonal least squares (LROLS) algorithm aided with the D-optimality experimental design, originally derived for constructing parsimonious real-valued RBF models, is extended to the fully complex-valued RBF (CVRBF) network. Lik...

Multi-output regression using a locally regularised orthogonal least-squares algorithm - Vision, Image and Signal Processing, IEE Proceedings-

The paper considers data modelling using multi-output regression models. A locally regularised orthogonal least-squares (LROLS) algorithm is proposed for constructing sparse multi-output regression models that generalise well. By associating each regressor in the regression model with an individual regularisation parameter, the ability of the multi-output orthogonal least-squares (OLS) model se...

Backward Elimination Methods for Associative Memory Network Pruning

Three hybrid data-based model construction/pruning algorithms are introduced, using backward elimination as an automatic postprocessing approach to improve model sparsity. Each approach is based on a composite cost function combining the model fit with one of three terms, A-optimality, D-optimality, or the parameter 1-norm used in basis pursuit, which determines the pruning process. The A-/D-optimality based ...
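To make the composite-cost idea concrete, the sketch below runs a backward-elimination loop in which each candidate removal is scored by squared fitting error plus a weighted parameter 1-norm, i.e. the basis-pursuit variant of the three penalties named above; the A- or D-optimality variants would swap in a different penalty term. The weight gamma and the stopping rule are illustrative assumptions, not values from the paper.

```python
import numpy as np

def backward_eliminate(P, y, gamma=0.05):
    """Backward-elimination pruning with a composite cost: squared fitting
    error plus gamma times the parameter 1-norm (the basis-pursuit penalty).
    gamma and the stopping rule are illustrative assumptions."""
    keep = list(range(P.shape[1]))

    def composite_cost(cols):
        theta, *_ = np.linalg.lstsq(P[:, cols], y, rcond=None)
        resid = y - P[:, cols] @ theta
        return float(resid @ resid) + gamma * float(np.abs(theta).sum())

    current = composite_cost(keep)
    while len(keep) > 1:
        # score every single-regressor removal and keep the cheapest one
        trials = [(composite_cost([c for c in keep if c != j]), j) for j in keep]
        best_cost, drop = min(trials)
        if best_cost >= current:   # no removal improves the composite cost: stop pruning
            break
        keep.remove(drop)
        current = best_cost
    return keep
```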


Journal: Control Theory and Applications, IEE Proceedings-

Volume   Issue

Pages  -

Publication date: 2001